Search results for "kernel methods"
Improved Statistically Based Retrievals via Spatial-Spectral Data Compression for IASI Data
2019
In this paper, we analyze the effect of spatial and spectral compression on the performance of statistically based retrieval. Although the quality of the information is not completely preserved during the coding process, experiments reveal that a certain amount of compression may yield a positive impact on the accuracy of retrievals. We unveil two strategies, both with interesting benefits: either to apply a very high compression, which still maintains the same retrieval performance as that obtained for uncompressed data; or to apply a moderate to high compression, which improves the performance. As a second contribution of this paper, we focus on the origins of these benefits. On the one…
Structured Output SVM for Remote Sensing Image Classification
2011
Traditional kernel classifiers assume independence among the classification outputs. As a consequence, each misclassification receives the same weight in the loss function. Moreover, the kernel function only takes into account the similarity between input values and ignores possible relationships between the classes to be predicted. These assumptions do not hold for most real-life problems. In the particular case of remote sensing data, this is not a good assumption either. Segmentation of images acquired by airborne or satellite sensors is a very active field of research in which one tries to classify a pixel into a predefined set of classes of interest (e.g. water, grass, trees,…
Estimating biophysical variable dependences with kernels
2010
This paper introduces a nonlinear measure of dependence between random variables in the context of remote sensing data analysis. The Hilbert-Schmidt Independence Criterion (HSIC) is a kernel method for evaluating statistical dependence. HSIC is based on computing the Hilbert-Schmidt norm of the cross-covariance operator of mapped samples in the corresponding Hilbert spaces. The HSIC empirical estimator is very easy to compute and has good theoretical and practical properties. We exploit the capabilities of HSIC to explain nonlinear dependences in two remote sensing problems: temperature estimation and chlorophyll concentration prediction from spectra. Results show that, when the relationshi…
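The empirical estimator mentioned above has a compact form, HSIC = tr(KHLH)/(n-1)^2, where K and L are the kernel matrices of the two variables and H is the centering matrix. A minimal illustrative sketch (names, RBF kernels, and bandwidths are arbitrary choices, not the paper's code):

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # pairwise squared distances -> Gaussian (RBF) kernel matrix
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * sigma**2))

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC estimator: tr(K H L H) / (n - 1)^2."""
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    L = rbf_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
y_dep = np.sin(3 * x) + 0.1 * rng.normal(size=(200, 1))  # nonlinearly dependent
y_ind = rng.normal(size=(200, 1))                         # independent of x
# the dependent pair yields a larger HSIC value than the independent one
```

Note that a purely linear measure such as correlation would miss the sin(3x) relation, which is exactly the regime the abstract targets.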
Kernel manifold alignment for domain adaptation
2016
The wealth of sensory data coming from different modalities has opened numerous opportunities for data analysis. The data are of increasing volume, complexity and dimensionality, thus calling for new methodological innovations towards multimodal data processing. However, multimodal architectures must rely on models able to adapt to changes in the data distribution. Differences in the density functions can be due to changes in acquisition conditions (pose, illumination), sensor characteristics (number of channels, resolution) or different views (e.g. street level vs. aerial views of the same building). We call these different acquisition modes domains, and refer to the adaptation proble…
Multi-temporal and Multi-source Remote Sensing Image Classification by Nonlinear Relative Normalization
2016
Remote sensing image classification exploiting multiple sensors is a very challenging problem: data from different modalities are affected by spectral distortions and mis-alignments of all kinds, and this hampers reusing models built for one image successfully in other scenes. In order to adapt and transfer models across image acquisitions, one must be able to cope with datasets that are not co-registered, acquired under different illumination and atmospheric conditions, by different sensors, and with scarce ground references. Traditionally, methods based on histogram matching have been used. However, they fail when densities have very different shapes or when there is no corres…
Gaussian Process Sensitivity Analysis for Oceanic Chlorophyll Estimation
2017
Source at https://doi.org/10.1109/JSTARS.2016.2641583. Gaussian process regression (GPR) has experienced tremendous success in biophysical parameter retrieval in the past years. The GPR provides a full posterior predictive distribution so one can derive mean and variance predictive estimates, i.e., point-wise predictions and associated confidence intervals. GPR typically uses translation invariant covariances that make the prediction function very flexible and nonlinear. This, however, makes the relative relevance of the input features hardly accessible, unlike in linear prediction models. In this paper, we introduce the sensitivity analysis of the GPR predictive mean and variance functions…
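The full posterior predictive distribution the abstract refers to follows the standard GP regression equations with a translation-invariant (RBF) covariance. The following is an illustrative sketch of those equations only, not the paper's sensitivity-analysis code; all names and hyperparameters are arbitrary:

```python
import numpy as np

def rbf(A, B, length=1.0):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * length**2))

def gpr_predict(Xtr, ytr, Xte, length=1.0, noise=0.1):
    """Standard GPR predictive mean and standard deviation."""
    K = rbf(Xtr, Xtr, length) + noise**2 * np.eye(len(Xtr))
    Ks = rbf(Xte, Xtr, length)
    Kss = rbf(Xte, Xte, length)
    mean = Ks @ np.linalg.solve(K, ytr)              # point-wise predictions
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)        # predictive covariance
    return mean, np.sqrt(np.clip(np.diag(cov), 0, None))

rng = np.random.default_rng(1)
Xtr = rng.uniform(-3, 3, size=(30, 1))
ytr = np.sin(Xtr[:, 0]) + 0.1 * rng.normal(size=30)
Xte = np.linspace(-3, 3, 5)[:, None]
mean, std = gpr_predict(Xtr, ytr, Xte)  # mean and confidence band per test point
```

The sensitivity analysis introduced in the paper differentiates these predictive mean and variance functions with respect to the inputs, which this sketch does not attempt.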
Explicit Recursive and Adaptive Filtering in Reproducing Kernel Hilbert Spaces
2014
This brief presents a methodology to develop recursive filters in reproducing kernel Hilbert spaces. Unlike previous approaches that exploit the kernel trick on filtered and then mapped samples, we explicitly define the model recursivity in the Hilbert space. For that, we exploit some properties of functional analysis and recursive computation of dot products without the need of preimaging or a training dataset. We illustrate the feasibility of the methodology in the particular case of the $\gamma$-filter, which is an infinite impulse response filter with controlled stability and memory depth. Different algorithmic formulations emerge from the signal model. Experiments in chaotic and elect…
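For reference, the underlying $\gamma$-filter signal model in input space is x_0(n) = u(n), x_k(n) = (1-mu) x_k(n-1) + mu x_{k-1}(n-1), with a linear readout over the K+1 memory states; the brief lifts this recursion into the RKHS. Below is a plain time-domain sketch of that input-space model only (hypothetical names, least-squares output weights), not the kernelized method:

```python
import numpy as np

def gamma_filter_states(u, K=3, mu=0.5):
    """Gamma memory states: x_0(n) = u(n),
    x_k(n) = (1-mu)*x_k(n-1) + mu*x_{k-1}(n-1), k = 1..K."""
    N = len(u)
    X = np.zeros((N, K + 1))
    for n in range(N):
        X[n, 0] = u[n]
        if n > 0:
            for k in range(1, K + 1):
                X[n, k] = (1 - mu) * X[n - 1, k] + mu * X[n - 1, k - 1]
    return X

u = np.sin(0.1 * np.arange(200))
X = gamma_filter_states(u)                            # (200, 4) state matrix
# least-squares readout for one-step-ahead prediction of u
w, *_ = np.linalg.lstsq(X[:-1], u[1:], rcond=None)
pred = X[:-1] @ w
```

The parameter mu jointly controls stability and memory depth, which is the property the abstract highlights for the IIR structure.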
Explicit recursivity into reproducing kernel Hilbert spaces
2011
This paper presents a methodology to develop recursive filters in reproducing kernel Hilbert spaces (RKHS). Unlike previous approaches that exploit the kernel trick on filtered and then mapped samples, we explicitly define model recursivity in the Hilbert space. The method exploits some properties of functional analysis and recursive computation of dot products without the need of pre-imaging. We illustrate the feasibility of the methodology in the particular case of the gamma-filter, an infinite impulse response (IIR) filter with controlled stability and memory depth. Different algorithmic formulations emerge from the signal model. Experiments in chaotic and electroencephalographic time se…
Kernel methods and their derivatives: Concept and perspectives for the earth system sciences.
2020
Kernel methods are powerful machine learning techniques which implement generic non-linear functions to solve complex tasks in a simple way. They have a solid mathematical background and exhibit excellent performance in practice. However, kernel machines are still considered black-box models as the feature mapping is not directly accessible and difficult to interpret. The aim of this work is to show that the functions learned by various kernel methods can be interpreted in an intuitive way despite their complexity. Specifically, we show that derivatives of these functions have a simple mathematical formulation, are easy to compute, and can be applied to many different problems. We n…
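The simple formulation of the derivatives can be illustrated for a standard kernel expansion f(x) = sum_i alpha_i k(x, x_i) with an RBF kernel, whose gradient is available in closed form: grad f(x) = sum_i alpha_i k(x, x_i) (x_i - x) / sigma^2. An illustrative sketch (arbitrary names and data) that checks the analytic gradient against finite differences:

```python
import numpy as np

def rbf(x, Xi, sigma=1.0):
    # k(x, x_i) for one query point x against all centers Xi
    return np.exp(-np.sum((x - Xi) ** 2, axis=1) / (2 * sigma**2))

def f(x, Xi, alpha, sigma=1.0):
    # kernel expansion f(x) = sum_i alpha_i k(x, x_i)
    return alpha @ rbf(x, Xi, sigma)

def grad_f(x, Xi, alpha, sigma=1.0):
    # closed-form gradient: alpha_i * k(x, x_i) * (x_i - x) / sigma^2, summed
    k = rbf(x, Xi, sigma)
    return (alpha * k) @ (Xi - x) / sigma**2

rng = np.random.default_rng(2)
Xi = rng.normal(size=(10, 3))
alpha = rng.normal(size=10)
x = rng.normal(size=3)

# central finite-difference check of the analytic gradient
eps = 1e-6
num = np.array([(f(x + eps * e, Xi, alpha) - f(x - eps * e, Xi, alpha)) / (2 * eps)
                for e in np.eye(3)])
print(np.allclose(grad_f(x, Xi, alpha), num, atol=1e-5))  # True
```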
Signal-to-noise ratio in reproducing kernel Hilbert spaces
2018
This paper introduces the kernel signal-to-noise ratio (kSNR) for different machine learning and signal processing applications. The kSNR seeks to maximize the signal variance while minimizing the estimated noise variance explicitly in a reproducing kernel Hilbert space (RKHS). The kSNR gives rise to considering complex signal-to-noise relations beyond additive noise models, and can be seen as a useful signal-to-noise regularizer for feature extraction and dimensionality reduction. We show that the kSNR generalizes kernel PCA (and other spectral dimensionality reduction methods), least squares SVM, and kernel ridge regression to deal with cases where signal and noise cannot be assumed inde…
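One of the special cases the abstract names, kernel ridge regression, has a closed-form solution alpha = (K + lambda I)^{-1} y with prediction f(x) = k(x, Xtr) alpha. A minimal illustrative sketch of that special case only (not the kSNR itself; names and hyperparameters are arbitrary):

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def krr_fit_predict(Xtr, ytr, Xte, lam=1e-2, sigma=1.0):
    """Kernel ridge regression: alpha = (K + lam*I)^{-1} y."""
    K = rbf(Xtr, Xtr, sigma)
    alpha = np.linalg.solve(K + lam * np.eye(len(Xtr)), ytr)
    return rbf(Xte, Xtr, sigma) @ alpha

rng = np.random.default_rng(3)
Xtr = rng.uniform(-3, 3, size=(40, 1))
ytr = np.sin(Xtr[:, 0])
Xte = np.linspace(-2, 2, 7)[:, None]
pred = krr_fit_predict(Xtr, ytr, Xte)  # nonlinear fit of sin from 40 samples
```

In this formulation, lam plays the role of an isotropic noise penalty; the kSNR replaces it with an explicit noise-variance model estimated in the RKHS.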